Compression of Sparse Matrices

Authors

  • Nicola Galli
  • Bernhard Seybold
Abstract

We consider a simple method to compress a sparse (n × n) random matrix into a table of size O(n) which works in expected linear time O(n). The worst-case time to access an entry in the compressed table is O(1). The compression scheme is based on a random greedy algorithm which places every row of a matrix with n nonzero entries into a table of size n. This minimal table size is achieved with high probability. In case of failure, the table is extended. Experimental results show that the algorithm is very efficient, even for small inputs.
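The abstract only outlines the scheme, so the following Python sketch shows one way such a random greedy row-displacement table can be built and queried. The function names, the dict-of-rows input format, the retry limit, and the rule for extending the table are illustrative assumptions, not the authors' implementation.

```python
import random

def compress_rows(rows, n, max_tries=50):
    """Greedy random row-displacement compression (illustrative sketch).

    `rows` maps a row index to {column: value} for that row's nonzero
    entries of an n x n matrix.  Each row gets a random offset into one
    shared table; its nonzeros are stored at slot offset + column.  If a
    row cannot be placed without collisions after `max_tries` random
    offsets, the table is extended and placement starts over.
    """
    size = n                                   # start from the minimal table size
    while True:
        value = [None] * size                  # stored matrix entries
        owner = [None] * size                  # which row occupies each slot
        offset, ok = {}, True
        for r, cols in rows.items():
            if not cols:
                continue
            span = size - max(cols)            # offsets that keep the row inside the table
            placed = False
            for _ in range(max_tries):
                off = random.randrange(span)
                if all(owner[off + c] is None for c in cols):
                    for c, v in cols.items():
                        owner[off + c] = r
                        value[off + c] = v
                    offset[r] = off
                    placed = True
                    break
            if not placed:                     # this table is too crowded
                ok = False
                break
        if ok:
            return offset, owner, value
        size += max(1, size // 10)             # extend the table and retry

def lookup(offset, owner, value, r, c):
    """Worst-case O(1) access: a slot is valid only if row r really owns it."""
    off = offset.get(r)
    if off is not None and off + c < len(owner) and owner[off + c] == r:
        return value[off + c]
    return 0

# Example usage (the table may grow beyond size n for unlucky placements):
# rows = {0: {1: 7.0}, 2: {0: 2.0, 3: 5.0}, 3: {2: 9.0}}
# off, owner, val = compress_rows(rows, 4)
# lookup(off, owner, val, 2, 3)  -> 5.0     lookup(off, owner, val, 1, 1) -> 0
```

Storing the owning row next to each stored value is what allows the constant-time membership check at lookup time, and growing the table when a row repeatedly fails to fit corresponds to the extension step mentioned in the abstract.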


Similar resources

Deblocking Joint Photographic Experts Group Compressed Images via Self-learning Sparse Representation

JPEG is one of the most widely used image compression methods, but it causes annoying blocking artifacts at low bit-rates. Sparse representation is an efficient technique which can solve many inverse problems in image processing applications such as denoising and deblocking. In this paper, a post-processing method is proposed for reducing JPEG blocking effects via sparse representation. In this ...


Sampling and Analytical Techniques for Data Distribution of Parallel Sparse Computation

We present a compile-time method to select compression and distribution schemes for sparse matrices which are computed using Fortran 90 array intrinsic operations. The selection process samples input sparse matrices to determine their sparsity structures. It is also guided by cost functions of various sparse routines as measured from the target machine. The Fortran 90 array expression is then t...


Transposition Mechanism for Sparse Matrices on Vector Processors

Many scientific applications involve operations on sparse matrices. However, due to irregularities induced by the sparsity patterns, many operations on sparse matrices execute inefficiently on traditional scalar and vector architectures. To tackle this problem a scheme has been proposed consisting of two parts: (a) An extension to a vector architecture to support sparse matrix-vector multiplica...


Sketching Sparse Covariance Matrices and Graphs

This paper considers the problem of recovering a sparse p × p matrix X given an m × m matrix Y = AXBᵀ, where A and B are known m × p matrices with m ≪ p. The main result shows that there exist constructions of the “sketching” matrices A and B so that even if X has O(p) non-zeros, it can be recovered exactly and efficiently using convex optimization as long as these non-zeros are not concentrated in an...
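As a purely dimensional illustration of the measurement model Y = AXBᵀ, the short NumPy sketch below builds a sparse X with O(p) nonzeros and forms the m × m sketch. The sizes and the Gaussian A and B are arbitrary stand-ins for the paper's structured constructions, and the convex recovery step is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

p, m = 200, 40                           # ambient and sketch dimensions, m << p
A = rng.standard_normal((m, p))          # "sketching" matrices; Gaussian here only
B = rng.standard_normal((m, p))          # for illustration, not the paper's design

X = np.zeros((p, p))                     # sparse p x p matrix with O(p) nonzeros
idx = rng.choice(p * p, size=3 * p, replace=False)
X.flat[idx] = rng.standard_normal(3 * p)

Y = A @ X @ B.T                          # the m x m sketch: p^2 unknowns, m^2 measurements
print(X.shape, Y.shape)                  # (200, 200) (40, 40)
```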


Sparse sign-consistent Johnson-Lindenstrauss matrices: compression with neuroscience-based constraints.

Johnson-Lindenstrauss (JL) matrices implemented by sparse random synaptic connections are thought to be a prime candidate for how convergent pathways in the brain compress information. However, to date, there is no complete mathematical support for such implementations given the constraints of real neural tissue. The fact that neurons are either excitatory or inhibitory implies that every so im...
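The sign-consistency constraint hinted at here (all weights leaving a single input coordinate share one sign, as for an excitatory or inhibitory neuron) can be illustrated with a small sketch. The dimensions, per-column sparsity, and scaling below are illustrative choices, not the construction analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

d, k, s = 1000, 60, 4          # input dim, target dim, nonzeros per column (illustrative)

# Sparse, sign-consistent projection: every input coordinate drives s random
# output coordinates, and all of its weights share a single sign.
M = np.zeros((k, d))
for col in range(d):
    sign = rng.choice([-1.0, 1.0])                 # "excitatory" or "inhibitory" column
    targets = rng.choice(k, size=s, replace=False)
    M[targets, col] = sign / np.sqrt(s)

x = rng.standard_normal(d)
y = M @ x                                          # compressed representation
print(np.linalg.norm(x), np.linalg.norm(y))        # squared norm preserved in expectation
```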


Improved Compressed Sensing Matrixes for Insulator Leakage Current Data Compressing

Insulator faults may lead to accidents in the power network, so on-line monitoring of insulators is very important. A low-rate wireless network is used for the data transmission of leakage current. Determining the measurement matrix is a key step in applying compressed sensing theory. This article proposes new sparse matrices which can be used as compressed sensing mat...



Journal:

Volume   Issue

Pages  -

Publication date: 1998